
    Ab Initio Modeling of Ecosystems with Artificial Life

    Artificial Life provides the opportunity to study the emergence and evolution of simple ecosystems in real time. We give an overview of the advantages and limitations of such an approach, as well as its relation to individual-based modeling techniques. The Digital Life system Avida is introduced, and prospects for experiments with ab initio evolution (evolution "from scratch"), maintenance, and stability of ecosystems are discussed. Comment: 13 pages, 2 figures

    Self-organized Criticality in Living Systems

    We suggest that ensembles of self-replicating entities such as biological systems naturally evolve into a self-organized critical state in which fluctuations, as well as waiting times between phase transitions, are distributed according to a 1/f power law. We demonstrate these concepts by analyzing a population of self-replicating strings (segments of computer code) subject to mutation and survival of the fittest. Comment: 8 pages, tar-compressed uuencoded postscript including figures, submitted to Phys. Rev. Lett.
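    A minimal sketch of the kind of experiment the abstract describes, under toy assumptions of my own (a bit-string genome and a count-the-ones fitness function standing in for the replication efficiency of a code segment): a mutating population evolves by survival of the fittest, and the waiting times between fitness jumps are recorded so their distribution can be inspected for heavy, power-law-like tails.

```python
# Toy sketch (not the authors' code): a mutating, self-replicating
# population whose best-fitness record shows punctuated jumps; the
# waiting times between jumps are collected for later inspection.
import random
import collections

random.seed(0)

L = 32        # genome length in bits (assumed)
POP = 200     # population size (assumed)
MU = 0.01     # per-bit mutation probability (assumed)
STEPS = 20000

def fitness(genome):
    # hypothetical stand-in for replication efficiency
    return sum(genome)

pop = [[0] * L for _ in range(POP)]
best = 0
last_jump = 0
waits = []

for t in range(STEPS):
    # survival of the fittest: of two random individuals,
    # the fitter one replicates with mutation
    a, b = random.sample(range(POP), 2)
    winner = pop[a] if fitness(pop[a]) >= fitness(pop[b]) else pop[b]
    child = [bit ^ (random.random() < MU) for bit in winner]
    pop[random.randrange(POP)] = child   # child replaces a random slot

    f = fitness(child)
    if f > best:                          # a "phase transition"
        best = f
        waits.append(t - last_jump)
        last_jump = t

# crude look at the waiting-time distribution
hist = collections.Counter(waits)
for w in sorted(hist):
    print(w, hist[w])
```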

    The use of Minimal Spanning Tree to characterize the 2D cluster galaxy distribution

    We use the Minimal Spanning Tree (MST) to characterize the aggregation level of given sets of points. We test three distances based on the histogram of the MST edges to discriminate between the distributions. We calibrate the method by using artificial sets following Poisson, King or NFW distributions. The distance using the mean, the dispersion and the skewness of the histogram of MST edges provides the most efficient results. We apply this distance to a subsample of the ENACS clusters and show that the bright galaxies are significantly more aggregated than the faint ones. The contamination from uniformly distributed field galaxies is negligible. On the other hand, we show that the presence of clustered groups along the same cluster line of sight masks the variation of the distance with the considered magnitude. Comment: 9 pages, 7 postscript figures, LaTeX A&A, accepted in A&A
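    The edge-length statistic itself is easy to reproduce. The sketch below (an illustration, not the paper's code) builds the MST of a 2D point set with SciPy and summarizes its edge-length histogram by the mean, dispersion and skewness, the combination the abstract identifies as most discriminating; the Poisson and clustered test sets are placeholders for the calibration distributions.

```python
# Sketch: MST edge-length statistics for a 2D point set.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform
from scipy.stats import skew

def mst_edge_stats(points):
    """points: (N, 2) array of positions; returns (mean, dispersion, skewness)."""
    dist = squareform(pdist(points))      # dense pairwise distance matrix
    mst = minimum_spanning_tree(dist)     # sparse matrix of the N-1 MST edges
    edges = mst.data                      # the MST edge lengths
    return edges.mean(), edges.std(), skew(edges)

rng = np.random.default_rng(1)
poisson = rng.uniform(0, 1, size=(500, 2))          # unclustered field
clustered = rng.normal(0.5, 0.05, size=(500, 2))    # a single clump

for name, pts in [("Poisson", poisson), ("clustered", clustered)]:
    m, s, g = mst_edge_stats(pts)
    print(f"{name}: mean={m:.4f} dispersion={s:.4f} skewness={g:.2f}")
```

    Clustered sets shift the histogram toward many short edges plus a tail of long bridges, which is what the mean/dispersion/skewness triple picks up.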

    Group analysis in the SSRS2 catalog

    We present an automated method to detect populations of groups in galaxy redshift catalogs. This method uses both an analysis of the redshift distribution along lines of sight in fixed cells, to detect elementary structures, and a friends-of-friends algorithm to merge these elementary structures into physical structures. We apply this method to the SSRS2 galaxy redshift catalog. The groups detected with our method are similar to group catalogs detected with pure friends-of-friends algorithms: they have a similar mass distribution, similar abundance versus redshift, a similar 2-point correlation function and the same redshift completeness limit, close to 5000 km/s. Using new-generation catalogs instead of the SSRS2 would push the completeness limit to z ∼ 0.7. We model the luminosity function of nearby galaxy groups by a Schechter function with parameters M* = (-19.99 +/- 0.36) + 5 log h and alpha = -1.46 +/- 0.17 in order to compute the mass-to-light ratio. The median mass-to-light ratio is 360 h M/L, and we deduce a relation between mass-to-light ratio and velocity dispersion sigma: M/L = (3.79 +/- 0.64) sigma - (294 +/- 570). The more massive the group, the higher the mass-to-light ratio and, therefore, the larger the amount of dark matter inside the group. Another explanation is a significant stripping of the gas of the galaxies in massive groups as opposed to low-mass groups. This extends to groups of galaxies the mild tendency already detected for rich clusters of galaxies. Finally, we detect a barely significant fundamental plane for these groups, but one much less narrow than for clusters of galaxies. Comment: 8 pages, 5 figures, accepted in A&A, shortened abstract
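    As a concrete illustration of the merging step, here is a hedged friends-of-friends sketch (not the authors' pipeline): any two points closer than a linking length belong to the same group, implemented as a union-find over close pairs. The coordinates and linking length below are arbitrary assumptions for demonstration.

```python
# Sketch: friends-of-friends grouping via union-find over close pairs.
import numpy as np
from scipy.spatial import cKDTree

def friends_of_friends(positions, linking_length):
    """Return a group label for each point."""
    parent = np.arange(len(positions))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    tree = cKDTree(positions)
    for i, j in tree.query_pairs(r=linking_length):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri                 # merge the two groups
    return np.array([find(i) for i in range(len(positions))])

rng = np.random.default_rng(2)
pts = np.vstack([rng.normal(0, 1, (50, 3)),        # one mock group
                 rng.uniform(-20, 20, (200, 3))])  # field galaxies
labels = friends_of_friends(pts, linking_length=1.5)
print("groups found:", len(set(labels)))
```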

    Limitations of the use of pressure waves to verify correct epidural needle position in dogs

    The use of pressure waves to confirm the correct position of the epidural needle has been described in several domestic species and proposed as a valid alternative to standard methods, namely control radiographic examination and fluoroscopy. The objective of this retrospective clinical study was to evaluate the sensitivity of epidural pressure waves as a test to verify correct needle placement in the epidural space in dogs, in order to determine whether this technique could be useful not only in the clinical setting but also when certain knowledge of the needle's tip position is required, for instance when performing clinical research on epidural anaesthesia. Of the 54 client-owned dogs undergoing elective surgeries and enrolled in this retrospective study, only 45% showed epidural pressure waves both before and after the epidural injection. Twenty-six percent of the animals showed epidural pressure waves only after the injection, whereas 29% of the dogs showed epidural pressure waves neither before nor after the injection and were defined as false negatives. Our results show that the epidural pressure wave technique for verifying epidural needle position lacks sensitivity, resulting in many false negatives. As a consequence, the applicability of this technique is limited to situations in which exact knowledge of the needle's tip position is not mandatory.
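    For concreteness, the reported percentages translate into sensitivities as follows (a back-of-the-envelope check that assumes all 54 needles were in fact correctly placed, per the study's reference standard):

```python
# Sensitivity = true positives / all positives, assuming every
# epidural in the cohort was correctly placed.
dogs = 54
before_and_after = round(0.45 * dogs)   # waves before and after injection
only_after = round(0.26 * dogs)         # waves only after injection
neither = round(0.29 * dogs)            # false negatives

sens_before = before_and_after / dogs
sens_after = (before_and_after + only_after) / dogs
print(f"sensitivity (pre-injection):  {sens_before:.0%}")   # about 45%
print(f"sensitivity (post-injection): {sens_after:.0%}")    # about 71%
```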

    Entropic Bell inequalities

    We derive entropic Bell inequalities from entropy Venn diagrams. These entropic inequalities, akin to the Braunstein-Caves inequalities, are violated for a quantum-mechanical Einstein-Podolsky-Rosen pair, which implies that the conditional entropies of Bell variables must be negative in this case. This suggests that the satisfaction of entropic Bell inequalities is equivalent to the non-negativity of conditional entropies as a necessary condition for separability.
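    The negativity the abstract points to can be made explicit. For a maximally entangled Bell pair the joint von Neumann entropy vanishes while each marginal entropy is maximal, so the conditional entropy is negative; a short sketch (the inequality quoted is the standard Braunstein-Caves form, given here as context rather than taken from this paper):

```latex
% Conditional entropy of a maximally entangled (Bell) pair:
% S(AB) = 0 (pure state) while S(B) = 1 bit (maximally mixed marginal).
\[
  S(A|B) \;=\; S(AB) - S(B) \;=\; 0 - 1 \;=\; -1 \ \text{bit} \;<\; 0 .
\]
% By contrast, an entropic Bell inequality of Braunstein--Caves type,
\[
  H(a \mid b) \;\le\; H(a \mid b') + H(a' \mid b') + H(a' \mid b),
\]
% holds whenever all conditional entropies are non-negative, which is
% why its violation signals negative conditional entropies and hence
% a non-separable state.
```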

    On the von Neumann capacity of noisy quantum channels

    We discuss the capacity of quantum channels for information transmission and storage. Quantum channels have dual uses: they can be used to transmit known quantum states which code for classical information, and they can be used in a purely quantum manner, for transmitting or storing quantum entanglement. We propose here a definition of the von Neumann capacity of quantum channels, which is a quantum mechanical extension of the Shannon capacity and reverts to it in the classical limit. As such, the von Neumann capacity assumes the role of a classical or quantum capacity depending on the usage of the channel. In analogy to the classical construction, this capacity is defined as the maximum von Neumann mutual entropy processed by the channel, a measure which reduces to the capacity for classical information transmission through quantum channels (the "Kholevo capacity") when known quantum states are sent. The quantum mutual entropy fulfills all basic requirements for a measure of information, and observes quantum data-processing inequalities. We also derive a quantum Fano inequality relating the quantum loss of the channel to the fidelity of the quantum code. The quantities introduced are calculated explicitly for the quantum "depolarizing" channel. The von Neumann capacity is interpreted within the context of superdense coding, and an "extended" Hamming bound is derived that is consistent with that capacity. Comment: 15 pages, RevTeX with psfig, 13 figures. Revised interpretation of capacity, added section, changed title
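    A small numerical sketch of the mutual-entropy quantity involved, under assumptions of my own: the depolarizing channel acts as rho -> (1-p) rho + p I/2, and a maximally mixed input is represented by half of a Bell pair, so the joint state of reference Q and channel output Q' can be written down directly and S(Q:Q') = S(Q) + S(Q') - S(QQ') evaluated.

```python
# Sketch: von Neumann mutual entropy across a depolarizing channel.
import numpy as np

def von_neumann_entropy(rho):
    """Entropy in bits, ignoring numerically zero eigenvalues."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log2(vals)))

def depolarize_one_side(p):
    """Joint state of reference Q and output Q' for a Bell-pair input."""
    phi = np.zeros(4)
    phi[0] = phi[3] = 1 / np.sqrt(2)          # |00> + |11> (normalized)
    bell = np.outer(phi, phi)
    return (1 - p) * bell + p * np.eye(4) / 4  # depolarized joint state

for p in (0.0, 0.25, 0.5, 1.0):
    rho = depolarize_one_side(p)
    s_joint = von_neumann_entropy(rho)
    # both marginals of this state are maximally mixed: 1 bit each
    mutual = 2.0 - s_joint
    print(f"p={p:.2f}: S(QQ')={s_joint:.3f} bits, S(Q:Q')={mutual:.3f} bits")
```

    At p = 0 the mutual entropy reaches 2 bits, consistent with the superdense-coding interpretation mentioned in the abstract, and it falls to 0 for a fully depolarizing channel.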
